    Compression image sharing using DCT- Wavelet transform and coding by Blackely method

    The increased use of computers and the internet has gone hand in hand with the wide use of multimedia information, and the requirement for protecting this information has risen dramatically. To prevent confidential information from being tampered with, cryptographic techniques must be applied. Most cryptographic strategies share one weak point: the information is centralized. To overcome this drawback, secret sharing was introduced. It is a technique for distributing a secret among a group of members such that every member owns a share of the secret, but only particular combinations of shares can reveal the secret; individual shares reveal nothing about it. The major challenge facing image secret sharing is the shadow size: the total size of the minimum set of shares needed for revealing is greater than the original secret file. The core of this work is therefore to use different transform coding strategies to make the share size as small as possible. In this paper, a compressive sharing system for images using transform coding and the Blackely method is introduced. In the proposed scheme, an appropriate transform (discrete cosine transform or wavelet) is first applied to de-correlate the image samples; the output (i.e., the compressed image data) is then fed to a diffusion scheme that removes any statistical redundancy or bits of important attributes remaining within the compressed stream; finally, a (k, n) threshold secret sharing scheme is applied, where n is the number of generated shares and k is the minimum number of shares needed for revealing. To ensure a high security level, each produced share is passed through a stream cipher that depends on an individual encryption key belonging to the shareholder.
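
    The (k, n) threshold step can be illustrated with Blakley-style geometry (the "Blackely" method named above, commonly spelled Blakley): the secret is one coordinate of a point in k-dimensional space, each share is a random hyperplane through that point, and any k shares determine the point as the solution of a linear system. The sketch below is a minimal illustration over a prime field; the modulus, coordinate layout, and helper names are assumptions, not the paper's implementation (which additionally compresses, diffuses, and stream-ciphers the shares).

    import random

    P = 2**31 - 1  # prime modulus (assumption; any prime larger than the secret works)

    def make_shares(secret, k, n):
        # The secret is the first coordinate of a point in k-dimensional
        # space; the remaining coordinates are random.
        point = [secret % P] + [random.randrange(P) for _ in range(k - 1)]
        shares = []
        for _ in range(n):
            # Each share is a hyperplane coeffs . x = b through the point.
            coeffs = [random.randrange(1, P) for _ in range(k)]
            b = sum(c * x for c, x in zip(coeffs, point)) % P
            shares.append((coeffs, b))
        return shares

    def reveal(shares):
        # Solve the k x k linear system mod P by Gauss-Jordan elimination;
        # k random hyperplanes are independent with overwhelming probability.
        k = len(shares)
        m = [list(c) + [b] for c, b in shares]
        for col in range(k):
            piv = next(r for r in range(col, k) if m[r][col])
            m[col], m[piv] = m[piv], m[col]
            inv = pow(m[col][col], P - 2, P)  # modular inverse via Fermat
            m[col] = [(v * inv) % P for v in m[col]]
            for r in range(k):
                if r != col and m[r][col]:
                    f = m[r][col]
                    m[r] = [(a - f * b) % P for a, b in zip(m[r], m[col])]
        return m[0][k]  # first coordinate of the solved point = the secret

    shares = make_shares(123456789, k=3, n=5)
    assert reveal(shares[:3]) == 123456789  # any 3 of the 5 shares suffice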

    User Identification and Verification from a Pair of Simultaneous EEG Channels Using Transform Based Features

    In this study, an approach that combines features from two simultaneous Electroencephalogram (EEG) channels recorded while a user performs a certain mental task is discussed, with the aim of increasing the degree of discrimination among subject classes; the feasibility of using sets of features extracted from a single channel was investigated in previously published articles. The feature sets considered in those studies are utilized to establish a combined set of features extracted from two channels. The first feature set is the energy density of the power spectrum of the Discrete Fourier Transform (DFT) or Discrete Cosine Transform; the second is the set of statistical moments of the Discrete Wavelet Transform (DWT). The Euclidean distance metric is used to accomplish the feature-set matching task. The combination of features from two EEG channels showed high accuracy for the identification system and competitive results for the verification system. The best achieved identification accuracy is 100% for all proposed feature sets. For the verification mode, the best achieved Half Total Error Rate (HTER) is 0.88 with 99.12% accuracy on the Colorado State University (CSU) dataset, and 0.26 with 99.97% accuracy on the Motor Movement/Imagery (MMI) dataset.
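
    As a concrete reading of the feature pipeline, the sketch below computes, per channel, band energies of the DFT power spectrum plus low-order moments of one-level Haar DWT detail coefficients, concatenates the two channels, and identifies a probe by nearest Euclidean distance. The band count, the Haar filter, and the choice of moments are illustrative assumptions, not the paper's exact configuration.

    import numpy as np

    def haar_dwt(x):
        # One-level Haar DWT (an assumed stand-in for the paper's DWT).
        x = x[: len(x) // 2 * 2]
        approx = (x[0::2] + x[1::2]) / np.sqrt(2)
        detail = (x[0::2] - x[1::2]) / np.sqrt(2)
        return approx, detail

    def channel_features(sig, n_bands=8):
        # Energy density of the DFT power spectrum over n_bands sub-bands.
        power = np.abs(np.fft.rfft(sig)) ** 2
        bands = [b.sum() for b in np.array_split(power, n_bands)]
        # Low-order statistical moments of the DWT detail coefficients.
        _, d = haar_dwt(sig)
        moments = [d.mean(), d.std(), ((d - d.mean()) ** 3).mean()]
        return np.array(bands + moments)

    def pair_features(ch1, ch2):
        # Combined feature set from two simultaneous channels.
        return np.concatenate([channel_features(ch1), channel_features(ch2)])

    def identify(probe, templates):
        # Nearest stored template under the Euclidean distance metric;
        # templates maps subject id -> enrolled feature vector.
        return min(templates, key=lambda sid: np.linalg.norm(probe - templates[sid]))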

    THE USE OF ROUGH CLASSIFICATION AND TWO THRESHOLD TWO DIVISORS FOR DEDUPLICATION

    The data deduplication technique efficiently reduces and removes redundant data in big data storage systems. The main issue is that data deduplication requires expensive computational effort to remove duplicate data due to the vast size of big data. This paper attempts to reduce the time and computation required for the data deduplication stages; the chunking and hashing stage in particular often requires a lot of calculation and time. The paper proposes an efficient new method to exploit parallel processing in deduplication systems with the best performance, designed to use multicore computing efficiently. First, the proposed method removes redundant data by roughly classifying the input into several classes using histogram similarity and the k-means algorithm. Next, a new method for calculating the divisor list for each class is introduced to improve the chunking method and increase the data deduplication ratio. Finally, the performance of the proposed method is evaluated using three datasets as test examples. The results prove that data deduplication based on classes and a multicore processor is much faster than on a single-core processor. Moreover, the experimental results show that the proposed method significantly improves the performance of the Two Threshold Two Divisors (TTTD) and Basic Sliding Window (BSW) algorithms.
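
    The chunking stage the paper builds on can be pictured with a simplified two-divisor, content-defined chunker: a rolling hash proposes cut points against a main divisor, a backup divisor supplies a fallback cut when the maximum chunk size is reached, and duplicates are then dropped by hashing each chunk. The hash function, divisor values, and size bounds below are illustrative assumptions; per-class parallelism could be layered on top with multiprocessing over the k-means classes.

    import hashlib

    T_MIN, T_MAX = 2048, 16384     # chunk size bounds (assumed)
    D_MAIN, D_BACKUP = 4096, 2048  # main and backup divisors (assumed)

    def chunk(data):
        # Simplified TTTD-style content-defined chunking with a toy rolling hash.
        chunks, start, backup, h = [], 0, -1, 0
        for i, byte in enumerate(data):
            h = (h * 31 + byte) & 0xFFFFFFFF
            if i - start + 1 < T_MIN:
                continue
            if h % D_BACKUP == 0:
                backup = i  # remember a fallback cut point
            if h % D_MAIN == 0 or i - start + 1 >= T_MAX:
                cut = i if h % D_MAIN == 0 else (backup if backup > start else i)
                chunks.append(bytes(data[start:cut + 1]))
                start, backup, h = cut + 1, -1, 0
        if start < len(data):
            chunks.append(bytes(data[start:]))
        return chunks

    def deduplicate(data):
        # Keep only one copy of each distinct chunk, keyed by its hash.
        store = {}
        for c in chunk(data):
            store.setdefault(hashlib.sha1(c).hexdigest(), c)
        return store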

    Image Compression Using Tap 9/7 Wavelet Transform and Quadtree Coding Scheme

    This paper is concerned with the design and implementation of an image compression method based on the biorthogonal tap-9/7 discrete wavelet transform (DWT) and a quadtree coding method. As a first step, color correlation is handled by using the YUV color representation instead of RGB. Then the chromatic sub-bands are downsampled, and the data of each color band is transformed using the wavelet transform. The produced wavelet sub-bands are quantized using a hierarchical scalar quantization method. The quantized detail coefficients are coded using quadtree coding followed by Lempel-Ziv-Welch (LZW) encoding, while the approximation coefficients are coded using delta coding followed by LZW encoding. The test results indicate that the compression results are comparable to those attained by standard compression schemes.
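
    The quadtree step can be illustrated on a quantized detail sub-band, where most coefficients are zero after quantization: an all-zero block is emitted as a single zero flag, and any other block is split into four quadrants recursively until single coefficients remain. Power-of-two block dimensions and the flat flag stream below are illustrative assumptions, not the paper's exact bitstream.

    import numpy as np

    def quadtree_encode(block, out):
        # Emit 0 for an all-zero block; otherwise emit 1 and recurse into
        # the four quadrants (assumes power-of-two dimensions).
        if not block.any():
            out.append(0)
            return
        out.append(1)
        if block.size == 1:
            out.append(int(block[0, 0]))  # leaf: the coefficient itself
            return
        h, w = block.shape
        for quad in (block[:h//2, :w//2], block[:h//2, w//2:],
                     block[h//2:, :w//2], block[h//2:, w//2:]):
            quadtree_encode(quad, out)

    band = np.zeros((8, 8), dtype=int)
    band[1, 2] = 5  # a single significant quantized coefficient
    stream = []
    quadtree_encode(band, stream)
    print(stream)   # a few split flags plus the value 5, ready for LZW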

    An efficient method for stamps recognition using Haar wavelet sub-bands

    Certain organizations, such as insurance companies and government institutions, handle a huge number of documents every day, so an automated stamp recognition system is required. The image of a stamp may appear on different backgrounds and at different sizes, may be rotated in different directions, and may suffer from soft areas (patches) or small points of noise. The main objective of this paper is therefore to extract and recognize color stamp images. The paper proposes a method to recognize stamps using Haar wavelet sub-bands. The devised method has four stages: 1) extracting the stamp image; 2) preprocessing the image; 3) feature extraction; and 4) matching. The method was implemented in the C# programming language (Microsoft Visual Studio 2012). Experiments conducted on a stamp dataset showed that the proposed method has a great capability to recognize stamps when using the Haar wavelet transform with two sets of features (i.e., a 100% recognition rate for energy features and a 99.93% recognition rate for low-order moments).
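
    The Haar sub-band features can be sketched as follows: one 2D Haar DWT level produces LL, LH, HL, and HH sub-bands, the energy of each sub-band forms a short descriptor, and a probe stamp is matched to the gallery entry at the smallest Euclidean distance. The single decomposition level and the energy normalization are illustrative assumptions, not the paper's exact feature definition.

    import numpy as np

    def haar2d(img):
        # One-level 2D Haar DWT: low/high-pass over rows, then columns.
        img = img[: img.shape[0] // 2 * 2, : img.shape[1] // 2 * 2].astype(float)
        lo = (img[0::2] + img[1::2]) / 2
        hi = (img[0::2] - img[1::2]) / 2
        ll, lh = (lo[:, 0::2] + lo[:, 1::2]) / 2, (lo[:, 0::2] - lo[:, 1::2]) / 2
        hl, hh = (hi[:, 0::2] + hi[:, 1::2]) / 2, (hi[:, 0::2] - hi[:, 1::2]) / 2
        return ll, lh, hl, hh

    def energy_features(img):
        # Energy of each sub-band, normalized for scale invariance (assumed).
        feats = np.array([np.sum(b ** 2) for b in haar2d(img)])
        return feats / feats.sum()

    def match(probe, gallery):
        # Nearest gallery stamp under the Euclidean distance;
        # gallery maps stamp name -> grayscale image array.
        f = energy_features(probe)
        return min(gallery, key=lambda n: np.linalg.norm(f - energy_features(gallery[n])))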

    Color image compression based on spatial and magnitude signal decomposition

    In this paper, a simple color image compression system based on image signal decomposition is proposed. The RGB color bands are converted to the less correlated YUV color model, and the pixel value (magnitude) in each band is decomposed into two values: most significant and least significant. Because the most significant value (MSV) is strongly affected by even simple modifications, an adaptive lossless compression system is proposed for it, using bit plane (BP) slicing, delta pulse code modulation (delta PCM), and adaptive quadtree (QT) partitioning followed by an adaptive shift encoder. A lossy compression system, on the other hand, is introduced to handle the least significant value (LSV); it is based on an adaptive, error-bounded coding system and uses the DCT compression scheme. The performance of the developed compression system was analyzed and compared with that of the universal JPEG standard, and the results indicate that its performance is comparable to or better than that of the JPEG standard.
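
    The magnitude decomposition can be pictured as splitting every 8-bit pixel into a most significant value, which the lossless path must preserve exactly, and a least significant value, which the lossy DCT path may approximate. The 4/4 bit split below is an illustrative assumption, not the paper's chosen split.

    import numpy as np

    SHIFT = 4  # bits assigned to the least significant value (assumed)

    def decompose(band):
        msv = band >> SHIFT              # most significant value (lossless path)
        lsv = band & ((1 << SHIFT) - 1)  # least significant value (lossy path)
        return msv, lsv

    def recompose(msv, lsv):
        return (msv << SHIFT) | lsv

    pixels = np.array([[200, 37], [129, 18]], dtype=np.uint8)
    msv, lsv = decompose(pixels)
    assert np.array_equal(recompose(msv, lsv), pixels)  # the split is exact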

    A Comparative Study for String Metrics and the Feasibility of Joining them as Combined Text Similarity Measures

    This paper introduces an optimized Damerau–Levenshtein and Dice-coefficient measure using enumeration operations (ODADNEN) that provides a fast string similarity measure while maintaining accurate results; searching for specific words within a large text is a hard job that takes a lot of time and effort, and the string similarity measure plays a critical role in many searching problems. In this paper, different experiments were conducted to handle some spelling mistakes, and an enhanced algorithm for string similarity assessment was proposed. This algorithm combines well-known algorithms with some improvements (e.g., the Dice coefficient was modified to deal with numbers instead of characters under certain conditions). These algorithms were adopted after conducting a number of experimental tests to check their suitability. The ODADNEN algorithm was tested using real data, and its performance was compared with the original similarity measures. The results indicate that the most convincing measure is the proposed hybrid one, which combines the Damerau–Levenshtein distance with a Dice distance based on the n-grams of each word; it also requires less processing time than the standard algorithms. Furthermore, it provides efficient results for assessing the similarity between two words without restricting the word length.
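
    The hybrid measure can be sketched by blending a normalized Damerau–Levenshtein similarity (the restricted-transposition variant) with a Dice coefficient over character bigrams; the equal weighting and the bigram choice below are illustrative assumptions, not the tuned ODADNEN combination.

    def damerau_levenshtein(a, b):
        # Edit distance counting insert, delete, substitute, and the
        # transposition of two adjacent characters as one operation each.
        d = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
        for i in range(len(a) + 1):
            d[i][0] = i
        for j in range(len(b) + 1):
            d[0][j] = j
        for i in range(1, len(a) + 1):
            for j in range(1, len(b) + 1):
                cost = 0 if a[i - 1] == b[j - 1] else 1
                d[i][j] = min(d[i - 1][j] + 1, d[i][j - 1] + 1, d[i - 1][j - 1] + cost)
                if i > 1 and j > 1 and a[i - 1] == b[j - 2] and a[i - 2] == b[j - 1]:
                    d[i][j] = min(d[i][j], d[i - 2][j - 2] + 1)
        return d[len(a)][len(b)]

    def dice_bigrams(a, b):
        # Dice coefficient over sets of character bigrams.
        ga = {a[i:i + 2] for i in range(len(a) - 1)}
        gb = {b[i:i + 2] for i in range(len(b) - 1)}
        return 2 * len(ga & gb) / (len(ga) + len(gb)) if ga or gb else 1.0

    def combined_similarity(a, b):
        # Equal-weight blend of edit similarity and bigram Dice (assumed).
        edit_sim = 1 - damerau_levenshtein(a, b) / max(len(a), len(b), 1)
        return (edit_sim + dice_bigrams(a, b)) / 2

    print(combined_similarity("wavelet", "wavelte"))  # tolerant of a transposition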